Linear Algebra Exam

EX1) True or False

Linear combination Notion

Mathematically, if you have two vectors $\mathbf{u}$ and $\mathbf{v}$, a linear combination of these vectors can be expressed as:

$$a\mathbf{u} + b\mathbf{v}$$

where $a$ and $b$ are scalar values.

1) Linear combinations

In order to check whether a linear combination can result in a specific vector, we put the vectors as the columns of a matrix and solve the system of linear equations this generates.
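A minimal numerical sketch of this check (the vectors and target here are made-up examples, not from the exam):

```python
import numpy as np

# Can b be written as x1*v1 + x2*v2?  Put v1, v2 as the columns of a
# matrix and solve the resulting linear system.
v1 = np.array([1.0, 2.0])
v2 = np.array([3.0, 1.0])
b = np.array([5.0, 5.0])

A = np.column_stack([v1, v2])
x, *_ = np.linalg.lstsq(A, b, rcond=None)

assert np.allclose(A @ x, b)  # b lies in the span of v1 and v2
print(x)                      # the coefficients x1, x2 of the combination
```

If `A @ x` does not reproduce `b` (a non-zero residual), the target is not in the span of the given vectors.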

2) Linearity of transformation

In order for a transformation $T$ to be linear, it must satisfy 2 conditions:

  • Additivity: $T(\mathbf{u} + \mathbf{v}) = T(\mathbf{u}) + T(\mathbf{v})$.
  • Homogeneity: $T(c\mathbf{u}) = cT(\mathbf{u})$ for any scalar $c$.
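The two conditions (additivity and homogeneity) can be spot-checked numerically on an example map; a failed check proves non-linearity, while passing checks only support it. A sketch using an assumed matrix transformation:

```python
import numpy as np

M = np.array([[1.0, 2.0],
              [0.0, 1.0]])  # example matrix (assumption)

def T(v):
    # A matrix transformation; every map of this form is linear.
    return M @ v

rng = np.random.default_rng(0)
u, v = rng.standard_normal(2), rng.standard_normal(2)
c = 3.7

assert np.allclose(T(u + v), T(u) + T(v))  # condition 1: additivity
assert np.allclose(T(c * u), c * T(u))     # condition 2: homogeneity
```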

3) Linear dependency check

Definition of $P_2$

$P_2$ represents the vector space of all polynomials of degree 2 or less with real coefficients. The elements of $P_2$ are polynomials of the form:

$$p(x) = a + bx + cx^2$$

where $a$, $b$ and $c$ are real numbers.

In order to check linear independence of two polynomials $p_1$ and $p_2$, we can either inspect them and see that they are not multiples of each other, or we solve the following equation for $c_1$ and $c_2$:

$$c_1 p_1(x) + c_2 p_2(x) = 0$$

How to go from this to a system of linear equations

You expand the equation, then group the terms by their power of $x$. A polynomial is zero only if every coefficient is zero, so each grouped coefficient gives one linear equation in $c_1$ and $c_2$.

If the only solution is $c_1 = 0$ and $c_2 = 0$, then the set is linearly independent. If there are other (non-zero) solutions for $c_1$ and/or $c_2$, then the set is linearly dependent.
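In coordinates this whole procedure reduces to a rank computation: write each polynomial as its coefficient vector $(a, b, c)$; the set is independent iff the matrix with those vectors as columns has full column rank. A sketch with example polynomials of my own choosing:

```python
import numpy as np

# p(x) = a + b*x + c*x^2  ->  coefficient vector (a, b, c)
p1 = np.array([1.0, 2.0, 0.0])  # 1 + 2x    (example)
p2 = np.array([0.0, 1.0, 1.0])  # x + x^2   (example)

# c1*p1 + c2*p2 = 0 has only the trivial solution iff the matrix with
# the polynomials as columns has rank 2 (full column rank).
M = np.column_stack([p1, p2])
independent = np.linalg.matrix_rank(M) == 2
print(independent)  # True: these two are not multiples of each other
```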

4) Checking for subspaces

WIP


EX2) Eigenvalues and Eigenvectors, Basis

1) Eigenvalues

Solve the characteristic equation given by:

$$\det(A - \lambda I) = 0$$

Warning

We may need to re-learn decompositions here.

2) Eigenvectors

Solve the system of linear equations given by:

$$(A - \lambda I)\mathbf{v} = \mathbf{0}$$

where $\lambda$ is one of the eigenvalues that we discovered previously and $\mathbf{v}$ is the corresponding eigenvector (which we are trying to find).
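Both steps (the characteristic equation, then the linear system per eigenvalue) are bundled into one numpy call, which is handy for checking hand computations; the matrix below is an illustrative example:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])  # example symmetric matrix

# eig solves det(A - lambda*I) = 0 and (A - lambda*I) v = 0 in one call.
eigvals, eigvecs = np.linalg.eig(A)

# Each column v of eigvecs satisfies A v = lambda v.
for lam, v in zip(eigvals, eigvecs.T):
    assert np.allclose(A @ v, lam * v)

print(np.sort(eigvals))  # for this A: eigenvalues 1 and 3
```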

3) Basis for Eigenspaces

An eigenspace is the space associated with one eigenvalue.
We have two cases:

  • We found multiple eigenvectors for a single eigenvalue $\lambda$.
  • We found a single eigenvector for a single eigenvalue $\lambda$.

In the case of one eigenvector, the basis is simply that eigenvector.
In the case of multiple eigenvectors (they should be linearly independent), the basis is the set of all of them, $\{\mathbf{v}_1, \ldots, \mathbf{v}_k\}$.
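A minimal sketch of the multi-eigenvector case, using an assumed matrix in which the eigenvalue 2 is repeated:

```python
import numpy as np

# Example: eigenvalue 2 appears twice, so its eigenspace is 2-dimensional.
A = np.diag([2.0, 2.0, 3.0])
eigvals, eigvecs = np.linalg.eig(A)

# Basis for the eigenspace of lambda = 2: the eigenvector columns whose
# eigenvalue equals 2 (linearly independent by construction here).
basis = eigvecs[:, np.isclose(eigvals, 2.0)]
print(basis.shape[1])  # dimension of this eigenspace: 2
```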

4) Diagonalization and Invertible matrices

Note

A matrix is called a diagonal matrix if all of its off-diagonal entries are zero. For example:

$$D = \begin{pmatrix} d_1 & 0 & 0 \\ 0 & d_2 & 0 \\ 0 & 0 & d_3 \end{pmatrix}$$

where $d_1$, $d_2$, $d_3$ are real numbers.

Hint

Every real symmetric matrix is diagonalizable.

A matrix $A$ is said to be symmetric if $A = A^T$, where $A^T$ denotes the transpose of $A$.

This entire process is called diagonalization; its goal is to find an invertible matrix $P$ and a diagonal matrix $D$ such that:

$$A = PDP^{-1}$$

where:

  • $P$: the matrix constructed by placing the eigenvectors of $A$ as its columns. If $A$ has $n$ linearly independent eigenvectors, then $P$ is invertible and $A$ can be diagonalized.
  • $D$: a diagonal matrix whose diagonal entries are the eigenvalues of $A$. The order of the eigenvalues in $D$ should match the order of the corresponding eigenvectors in $P$.
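The construction of $P$ and $D$ can be verified numerically; the matrix here is an assumed example with distinct eigenvalues:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])  # example matrix; eigenvalues 2 and 5

eigvals, P = np.linalg.eig(A)  # columns of P are the eigenvectors
D = np.diag(eigvals)           # eigenvalues, in the same order as P's columns

# With n linearly independent eigenvectors, P is invertible and A = P D P^{-1}.
assert np.allclose(A, P @ D @ np.linalg.inv(P))
```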


EX3) Basis for Row space and Column space, Rank, Nullity, Null space

1) Basis for Row space

Reduce the matrix to row echelon form (R.E.F.), then take the non-zero rows. Those will be the vectors of our basis.


2) Basis for Column space

You can either reuse the R.E.F. from the row-space computation and take the columns of the original matrix that sit in the pivot positions (those columns are guaranteed to be linearly independent),

or you can apply the same process used for the row space to $A^T$.
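Both bases can be read off a single elimination pass. The helper below is a hand-rolled R.E.F. (numpy has no built-in one), and the matrix is an example of my own, not the exam's:

```python
import numpy as np

def ref_pivots(A, tol=1e-10):
    """Gaussian elimination to row echelon form; returns (R, pivot_columns)."""
    R = A.astype(float).copy()
    pivots, row = [], 0
    for col in range(R.shape[1]):
        if row >= R.shape[0]:
            break
        # Partial pivoting: largest entry in this column at or below `row`.
        p = row + np.argmax(np.abs(R[row:, col]))
        if abs(R[p, col]) < tol:
            continue  # no pivot in this column
        R[[row, p]] = R[[p, row]]
        R[row + 1:] -= np.outer(R[row + 1:, col] / R[row, col], R[row])
        pivots.append(col)
        row += 1
    return R, pivots

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],   # = 2 * row 1, so rank is 2
              [1.0, 0.0, 1.0]])

R, pivots = ref_pivots(A)
row_basis = R[:len(pivots)]  # non-zero rows of the R.E.F. span Row(A)
col_basis = A[:, pivots]     # pivot columns of the ORIGINAL A span Col(A)
print(pivots)                # pivot columns: [0, 1]
```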

3) Rank

The rank is just the dimension of the column space (the number of vectors in its basis); it equals the dimension of the row space as well.

4) Nullity

The nullity of a matrix is the dimension of its null space.
We can compute it through the rank-nullity relation:

$$\operatorname{rank}(A) + \operatorname{nullity}(A) = n$$

where $n$ is the number of columns of $A$.

5) Null space

The null space of a matrix $A$ is the set of vectors that are sent to $\mathbf{0}$ by the transformation:

$$\operatorname{Null}(A) = \{\mathbf{x} : A\mathbf{x} = \mathbf{0}\}$$

We just need to solve the system of linear equations given by the formula above.

Suppose the free variables are $x_3$ and $x_4$; we then set $x_3 = s$ and $x_4 = t$.

We then group the solution by the parameters $s$ and $t$, which generates the two basis vectors of the null space.
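A numerical sketch of the whole pipeline (the matrix is a made-up example with two free variables, not the exam's):

```python
import numpy as np

A = np.array([[1.0, 2.0, 0.0, 1.0],
              [0.0, 0.0, 1.0, 2.0]])  # example matrix, rank 2

rank = np.linalg.matrix_rank(A)
nullity = A.shape[1] - rank  # rank + nullity = number of columns

# Null-space basis via the SVD: the right-singular vectors belonging to
# zero singular values span {x : A x = 0}.
_, _, Vt = np.linalg.svd(A)
null_basis = Vt[rank:].T  # one basis column per free variable

assert np.allclose(A @ null_basis, 0)
print(rank, nullity)  # 2 2
```

This SVD route gives an orthonormal basis, which generally differs from (but spans the same space as) the hand-computed parametric vectors.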


EX4) Determinant properties, Invertibility, basis

Determinant properties

  • $\det(A^{-1}) = \dfrac{1}{\det(A)}$ (if $A$ is invertible, so $\det(A) \neq 0$).
  • $\det(A^T) = \det(A)$ (if $A$ is square).
  • $\det(cA) = c^n \det(A)$ (if $A$ is an $n \times n$ square matrix and $c$ a scalar).
  • $\det(AB) = \det(A)\det(B)$ (if $A$ and $B$ are square).
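These identities are easy to sanity-check on random matrices (random 3x3 examples, with $c = 2$ and $n = 3$):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))  # random matrices are invertible
B = rng.standard_normal((3, 3))  # with probability 1
det = np.linalg.det

assert np.isclose(det(np.linalg.inv(A)), 1.0 / det(A))  # det(A^-1) = 1/det(A)
assert np.isclose(det(A.T), det(A))                     # det(A^T)  = det(A)
assert np.isclose(det(2.0 * A), 2.0**3 * det(A))        # det(cA)   = c^n det(A)
assert np.isclose(det(A @ B), det(A) * det(B))          # det(AB)   = det(A)det(B)
```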

Solutions for invertibility

We just pose and solve the equation.
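A small numeric sketch, using a hypothetical matrix with one unknown entry $k$ (not the exam's matrix): here $\det = k - 6$, so invertibility fails only at $k = 6$.

```python
import numpy as np

def det_A(k):
    # Hypothetical A(k) = [[1, 2], [3, k]]; det(A) = 1*k - 2*3 = k - 6.
    return np.linalg.det(np.array([[1.0, 2.0], [3.0, k]]))

assert abs(det_A(6.0)) < 1e-9  # singular (not invertible) at k = 6
assert abs(det_A(0.0)) > 1e-9  # invertible for k != 6
```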